conceptual compression
The Second Machine Turn: From Checking Proofs to Creating Concepts
We identify a second machine turn in the process of mathematical discovery: after automating proof-checking, AI is now poised to automate the *creation* of mathematical concepts themselves. We discuss the current state of the art, obstacles, and potential solutions, as well as a preliminary attempt at mathematizing the creation of concepts itself. The paper ends with an assessment of how these capabilities could reshape mathematics and human-machine collaboration, and a few different futures we might find ourselves in.
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
- Information Technology > Artificial Intelligence > Cognitive Science > Problem Solving (0.46)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (0.46)
Towards Conceptual Compression
Karol Gregor, Frederic Besse, Danilo Jimenez Rezende, Ivo Danihelka, Daan Wierstra
We introduce convolutional DRAW, a homogeneous deep generative model achieving state-of-the-art performance in latent variable image modeling. The algorithm naturally stratifies information into higher- and lower-level details, creating abstract features and thereby addressing one of the fundamental goals of representation learning. Furthermore, the hierarchical ordering of its latents creates the opportunity to selectively store global information about an image, yielding a high-quality 'conceptual compression' framework. Published at the Neural Information Processing Systems Conference.
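The lossy-compression idea in the abstract can be sketched with a toy model (an illustration under assumed names and a made-up decoder, not the paper's convolutional DRAW): if latent steps are ordered from global to fine-grained, storing only the first k steps and letting the decoder fall back to its prior mean for the rest gives a code whose reconstruction degrades gracefully.

```python
import random

# Toy sketch of 'conceptual compression' via hierarchically ordered latents.
# Assumption: purely illustrative -- the weighting scheme and decoder below
# are invented for this sketch, not taken from the paper.
random.seed(0)
T, D = 8, 4  # number of latent steps, latent dimension
latents = [[random.gauss(0, 1) for _ in range(D)] for _ in range(T)]

def decode(zs, steps=T, dim=D):
    """Toy decoder: weighted sum of per-step contributions. Earlier steps
    get larger weights, standing in for coarse/global structure; missing
    (unstored) steps fall back to the prior mean (zero)."""
    x = [0.0] * dim
    for t in range(steps):
        z = zs[t] if t < len(zs) else [0.0] * dim  # unstored step -> prior mean
        for d in range(dim):
            x[d] += (0.5 ** t) * z[d]              # geometric detail falloff
    return x

def err(a, b):
    """Euclidean distance between two reconstructions."""
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

full = decode(latents)
for k in range(T + 1):
    approx = decode(latents[:k])  # store only the k most global latent steps
    print(f"stored {k}/{T} latent steps -> reconstruction error {err(full, approx):.3f}")
```

Because the detail weight decays geometrically, truncating late (fine-detail) steps costs little, while dropping early (global) steps costs a lot; that asymmetry is what makes selectively storing the top of the hierarchy a sensible compression strategy.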
Ego-motion in Self-Aware Deep Learning – Intuition Machine – Medium
We are now in the middle of 2018, and deep learning research is advancing at an exponential rate. At the beginning of this year, I made 10 predictions about what to expect for the year. Making predictions and comparing them against reality in hindsight is one way to determine whether one's expectations are overshooting reality. It turns out they were undershooting it in one respect: I had not expected to see this much new research in "self-awareness". There is a consensus understanding that self-awareness in machines can lead to more autonomous machines, and ultimately to machines with consciousness.